Output Regularized Metric Learning with Side Information

Authors

  • Wei Liu
  • Steven C. H. Hoi
  • Jianzhuang Liu
Abstract

Distance metric learning has been widely investigated in machine learning and information retrieval. In this paper, we study a particular content-based image retrieval application of learning distance metrics from historical relevance feedback log data, which leads to a novel scenario called collaborative image retrieval. The log data provide the side information expressed as relevance judgements between image pairs. Exploiting the side information as well as inherent neighborhood structures among examples, we design a convex regularizer upon which a novel distance metric learning approach, named output regularized metric learning, is presented to tackle collaborative image retrieval. Different from previous distance metric methods, the proposed technique integrates synergistic information from both log data and unlabeled data through a regularization framework and pilots the desired metric toward the ideal output that satisfies pairwise constraints revealed by side information. The experiments on image retrieval tasks have been performed to validate the feasibility of the proposed distance metric technique.
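The abstract describes the approach only at a high level, so the sketch below is a hedged illustration of the general idea rather than the paper's actual algorithm. It assumes a Mahalanobis metric M learned by projected gradient descent, with the logged pairwise relevance judgements acting as must-link/cannot-link constraints and a kNN-graph Laplacian over the (unlabeled) image descriptors standing in for the neighborhood-structure regularizer; all function names and parameter values are illustrative.

```python
import numpy as np

def knn_graph_laplacian(X, k=5):
    """Unnormalized Laplacian of a symmetric kNN graph over the rows of X."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(d2[i])[1:k + 1]] = 1.0   # k nearest neighbours, self excluded
    W = np.maximum(W, W.T)                       # symmetrize
    return np.diag(W.sum(1)) - W

def learn_metric(X, similar, dissimilar, lam=0.1, margin=1.0, lr=0.01, iters=200):
    """Projected-gradient sketch: pull 'similar' pairs together, push 'dissimilar'
    pairs apart up to a margin, and smooth the metric with the unlabeled-data Laplacian."""
    n, d = X.shape
    M = np.eye(d)
    smooth = X.T @ knn_graph_laplacian(X) @ X    # gradient of tr(X^T L X M) w.r.t. M
    for _ in range(iters):
        grad = lam * smooth
        for i, j in similar:                     # logged "relevant" pairs: shrink their distance
            diff = (X[i] - X[j])[:, None]
            grad += diff @ diff.T
        for i, j in dissimilar:                  # logged "irrelevant" pairs: grow it while below the margin
            diff = (X[i] - X[j])[:, None]
            if float(diff.T @ M @ diff) < margin:
                grad -= diff @ diff.T
        M -= lr * grad
        w, V = np.linalg.eigh(M)                 # project back onto the PSD cone
        M = V @ np.diag(np.clip(w, 0.0, None)) @ V.T
    return M

# toy usage: 20 random image descriptors and a few logged relevance judgements
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 8))
M = learn_metric(X, similar=[(0, 1), (2, 3)], dissimilar=[(0, 4)])
d_M = lambda a, b: float((X[a] - X[b]) @ M @ (X[a] - X[b]))
print(d_M(0, 1), d_M(0, 4))
```

The projection step is what keeps M a valid metric; any variant of this idea needs some mechanism (projection, factorization, or a PSD-preserving update) to stay on the cone of positive semidefinite matrices.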


Similar resources

An Effective Approach for Robust Metric Learning in the Presence of Label Noise

Many algorithms in machine learning, pattern recognition, and data mining are based on a similarity/distance measure. For example, the kNN classifier and clustering algorithms such as k-means require a similarity/distance function. Also, in Content-Based Information Retrieval (CBIR) systems, we need to rank the retrieved objects based on the similarity to the query. As generic measures such as ...


Manifold Regularized Transfer Distance Metric Learning

The performance of many computer vision and machine learning algorithms depends heavily on the distance metric between samples. It is necessary to exploit abundant side information, such as pairwise constraints, to learn a robust and reliable distance metric. In real-world applications, however, large quantities of labeled data are unavailable due to the high labeling cost. Transfer distance metri...


Distributed Learning with Regularized Least Squares

We study distributed learning with the least squares regularization scheme in a reproducing kernel Hilbert space (RKHS). By a divide-and-conquer approach, the algorithm partitions a data set into disjoint data subsets, applies the least squares regularization scheme to each data subset to produce an output function, and then takes an average of the individual output functions as a final global ...
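The divide-and-conquer step described here is concrete enough to sketch. The following is a hedged illustration rather than the paper's implementation: it assumes kernel ridge regression as the regularized least squares estimator on each disjoint subset and an unweighted average of the local predictors; the kernel choice, block count, and parameter values are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_local(X, y, lam=0.1, gamma=1.0):
    """Regularized least squares on one subset: alpha = (K + lam*n*I)^{-1} y."""
    n = len(X)
    alpha = np.linalg.solve(rbf_kernel(X, X, gamma) + lam * n * np.eye(n), y)
    return lambda Xq: rbf_kernel(Xq, X, gamma) @ alpha

def distributed_rls(X, y, n_blocks=4, lam=0.1, gamma=1.0):
    """Divide-and-conquer: partition the data into disjoint blocks, fit one local
    estimator per block, and return the average of the individual output functions."""
    blocks = np.array_split(np.random.permutation(len(X)), n_blocks)
    estimators = [fit_local(X[idx], y[idx], lam, gamma) for idx in blocks]
    return lambda Xq: np.mean([f(Xq) for f in estimators], axis=0)

# toy usage: noisy sine regression split across 4 blocks
X = np.linspace(0.0, 3.0, 400)[:, None]
y = np.sin(2.0 * X[:, 0]) + 0.1 * np.random.randn(400)
predict = distributed_rls(X, y)
print(predict(np.array([[1.0], [2.0]])))
```

The appeal of the scheme is that each block solves a much smaller linear system, and averaging the local solutions recovers accuracy close to the full-data estimator under suitable conditions.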


Regularized Distance Metric Learning: Theory and Algorithm

In this paper, we examine the generalization error of regularized distance metric learning. We show that with appropriate constraints, the generalization error of regularized distance metric learning can be independent of the dimensionality, making it suitable for handling high-dimensional data. In addition, we present an efficient online learning algorithm for regularized distance metric l...
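The entry is truncated before the algorithmic details, so the following is only a guess at what an online, regularized metric learner can look like: one pairwise constraint per step, a hinge loss on the Mahalanobis distance with a Frobenius-norm regularizer, and a projection back onto the PSD cone after each update. The names and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def online_regularized_metric(pairs, d, lam=0.01, lr=0.1, margin=1.0):
    """Online sketch: stream of (x_i, x_j, y) constraints, y=+1 similar / y=-1 dissimilar.
    Each step takes a subgradient of hinge(y * (d_M(x_i, x_j) - margin)) + (lam/2)*||M||_F^2
    and projects M back onto the PSD cone."""
    M = np.eye(d)
    for xi, xj, y in pairs:
        diff = (xi - xj)[:, None]
        dist = float(diff.T @ M @ diff)
        grad = lam * M                           # Frobenius regularization term
        if y * (dist - margin) > 0:              # hinge active: the constraint is violated
            grad += y * diff @ diff.T
        M -= lr * grad
        w, V = np.linalg.eigh(M)                 # projection onto the PSD cone
        M = V @ np.diag(np.clip(w, 0.0, None)) @ V.T
    return M

# toy usage: a stream of 50 random pairwise constraints in 5 dimensions
rng = np.random.default_rng(1)
stream = [(rng.normal(size=5), rng.normal(size=5), rng.choice([-1, 1])) for _ in range(50)]
print(np.round(online_regularized_metric(stream, d=5), 2))
```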


Composite Kernel Optimization in Semi-Supervised Metric

Machine-learning solutions to classification, clustering, and matching problems critically depend on the adopted metric, which in the past was selected heuristically. In the last decade, it has been demonstrated that an appropriate metric can be learnt from data, resulting in superior performance compared with traditional metrics. This has recently stimulated considerable interest in the to...



Journal:

Volume   Issue

Pages   -

Publication date: 2008